
Conversation

@saihaj (Owner) commented Apr 25, 2025

The good thing about this approach is that we can cut down the size of the payload our server needs to process in #248: OpenAI manages the context, so the client can also use `useCompletion`, and from the UI's perspective each request is just one-shot processing.
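
A minimal sketch of what the one-shot client could look like, assuming the Vercel AI SDK's `useCompletion` hook (import path varies by SDK version) and a hypothetical `/api/completion` route plus `conversationId` body field for the OpenAI-managed context:

```tsx
'use client';

import { useCompletion } from '@ai-sdk/react';

// Hypothetical component: the client never accumulates message history;
// it only sends the new prompt plus an id so the server/OpenAI can look
// up the conversation state it already holds.
export function AskBox({ conversationId }: { conversationId: string }) {
  const { completion, input, handleInputChange, handleSubmit, isLoading } =
    useCompletion({
      api: '/api/completion', // assumed route name
      body: { conversationId }, // assumed field; server maps it to OpenAI-managed context
    });

  return (
    <form onSubmit={handleSubmit}>
      <input value={input} onChange={handleInputChange} disabled={isLoading} />
      <pre>{completion}</pre>
    </form>
  );
}
```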

Even with `useChat` this could be a good idea, but then we need to figure out how to make it drop all the accumulated messages before the fetch. Whichever is easier.
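
One possible way to do the "drop the messages before fetch" part, assuming the AI SDK version in use supports the `experimental_prepareRequestBody` option on `useChat` (the `/api/chat` route and `conversationId` prop are hypothetical):

```tsx
'use client';

import { useChat } from '@ai-sdk/react';

// Hypothetical component: the UI still renders the local message list,
// but the request body sent to the server contains only the newest
// message and the chat id, not the full history.
export function Chat({ conversationId }: { conversationId: string }) {
  const { messages, input, handleInputChange, handleSubmit } = useChat({
    api: '/api/chat', // assumed route name
    id: conversationId,
    // Strip everything except the latest user message before the fetch.
    experimental_prepareRequestBody: ({ messages, id }) => ({
      id,
      message: messages[messages.length - 1],
    }),
  });

  return (
    <form onSubmit={handleSubmit}>
      {messages.map(m => (
        <p key={m.id}>
          {m.role}: {m.content}
        </p>
      ))}
      <input value={input} onChange={handleInputChange} />
    </form>
  );
}
```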

@vercel vercel bot commented Apr 25, 2025

The latest updates on your projects. Learn more about Vercel for Git ↗︎

2 Skipped Deployments

Name                Status      Updated (UTC)
dogeai              ⬜️ Skipped  Apr 25, 2025 7:37pm
management-doge-ai  ⬜️ Skipped  Apr 25, 2025 7:37pm
